Web Survey Bibliography
Traditionally, business data for official statistics have been collected with paper questionnaires in self-administered surveys. Nowadays, paper questionnaires are increasingly being replaced by web questionnaires. A variety of strategies can be followed to introduce the web in business surveys. In Norway in 2004, and recently in Denmark, it was decided that all business surveys should be moved to the web quickly. In the Netherlands, an effort has been made to develop a well-designed web questionnaire: the Structural Business Survey questionnaire was fully designed and pre-tested over a two-year period. The result was intended to serve as an example for all other surveys. A driving force behind this development is a shared interest of the surveyors and those who are surveyed in reducing the manpower, and hence the costs, of business surveys (including response burden).
But neither these ambitions nor quality improvements come automatically with technological innovations. At international conferences, workshops and meetings, we find that many methodologists are struggling with the implementation of these technologies. In February 2010, methodologists from eight European countries met in Copenhagen to discuss how common EU-regulated surveys can best be transferred from paper to web (for both business and social surveys; the focus was on business surveys). The idea for this meeting was born when data collection methodologists from Statistics Denmark visited Statistics Netherlands in May 2009 to discuss web questionnaire designs. The initiative to organise the meeting was taken at the 2009 ISM Workshop in Bergamo. As a follow-up to the Copenhagen meeting, this topic was also on the agenda of the Eurostat Working Group of Statistical Quality in June 2010, where it was decided to discuss the need for an action plan and concrete projects with the Directors of Methodology.
This 2011 ISM presentation is a follow-up to the Copenhagen initiative and is meant to report back to the participants on what has been done. In the presentation we will give an overview of the issues that have been discussed, and relate them to non-sampling errors such as non-response and measurement issues, as well as to response burden. We would like to discuss with the audience how the Copenhagen initiative and the issues it raised could best be followed up.
Issues that have been discussed (and which relate to other presentations in the Workshop) are:
– An issue that comes up again and again is how to get sampled units to pick up the web questionnaire: What strategies should be used to increase the take-up rate? Should a paper questionnaire still be available, and offered?
– Where do we stand when it comes to guidelines on how to design web questionnaires? One much-discussed issue under this heading is how similar or different web and paper questionnaires should be in a mixed-mode data collection design. When respondents use a web questionnaire, they expect it to have some intelligence. What do respondents expect, and what guidelines can be given to make the questionnaire respondent-friendly? Issues here include, e.g., the use of matrix questions and of edit checks that help to obtain good data quality but may also cause respondents to abandon the questionnaire. Another issue is the use of historical data in the questionnaire (comparable to dependent interviewing).
– Talking about mixed-mode designs: How to deal with mode effects?
– Technology issues include, e.g., how to deal with the variety of web browsers.
– How to implement web questionnaires? When moving to the web, Statistics Netherlands on the one hand, and Statistics Norway and Statistics Denmark on the other, adopted different approaches (as discussed above). What did we learn from these two approaches?
– Once a questionnaire has been developed, the issue is: How do new methods affect pre-tests and the ability to monitor the response process?
During the development of web questionnaires, traditional cognitive interviewing and the techniques of usability studies can be combined, e.g. by using paradata and eye-tracking during individual tests. Computerized questionnaires also open up the possibility of monitoring the response process in detail while the survey is being conducted (using paradata), both at the level of overall response rates and at the level of individual respondents (using audit trails).
In our presentation we will focus in more detail on web pick-up issues and response burden issues.
Issues that have not yet been addressed, but which are important and can be discussed at the workshop, are:
– How to organise the data collection and logistics for web and mixed-mode surveys?
– What software should be used: should one develop one's own software or use software that is available on the market?
– How to organise research, collaborate with universities, and bring in the literature?